Why deterministic logic is hard to learn but Statistical Relational Learning works
Author
Abstract
Consider a standard regression setup: we have data {(x, y)}. See figure 1 – a story I am sure you already know. Setting (a) assumes that there is no uncertainty in the world/data and tries to model the data exactly. This assumption ties you down to modeling every observation literally, e.g., in a Bayesian setup with a GP without observation noise. Ultimately, this prohibits generalization, simplification, abstraction, regularization, finding a compact description, etc. In contrast, (b) assumes uncertainty in the observations. This uncertainty (e.g., observation noise in a GP model) is the key that allows you to fit a generalizing, smooth, regularized, abstracting function.
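The contrast between (a) and (b) can be sketched with scikit-learn's GaussianProcessRegressor, where the `alpha` parameter adds observation noise to the kernel diagonal. The toy data, kernel, and noise levels below are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = np.linspace(0.0, 5.0, 10).reshape(-1, 1)
y = np.sin(X).ravel() + 0.3 * rng.standard_normal(10)  # noisy observations

kernel = RBF(length_scale=0.5)

# (a) No observation noise: the posterior mean must pass through every
# training point literally -- no room for smoothing or abstraction.
gp_exact = GaussianProcessRegressor(kernel=kernel, alpha=1e-10,
                                    optimizer=None).fit(X, y)

# (b) Observation noise (alpha is added to the kernel diagonal): the
# posterior mean is free to be a smooth, regularized function.
gp_noisy = GaussianProcessRegressor(kernel=kernel, alpha=0.2,
                                    optimizer=None).fit(X, y)

err_exact = np.max(np.abs(gp_exact.predict(X) - y))  # ~0: literal fit
err_noisy = np.max(np.abs(gp_noisy.predict(X) - y))  # > 0: smoothing
```

The noiseless GP reproduces the noisy targets essentially exactly, while the noisy GP leaves visible residuals at the training points because it trades fidelity to individual observations for a smoother underlying function.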
Similar references
CPT-L: an Efficient Model for Relational Stochastic Processes
Agents that learn and act in real-world environments have to cope with both complex state descriptions and non-deterministic transition behavior of the world. Standard statistical relational learning techniques can capture this complexity, but are often inefficient. We present a simple probabilistic model for such environments based on CP-Logic. Efficiency is maintained by restriction to a full...
On Continuous Distributions and Parameter Estimation in Probabilistic Logic Programs
In the last decade remarkable progress has been made on combining statistical machine learning techniques, reasoning under uncertainty, and relational representations. The branch of Artificial Intelligence working on the synthesis of these three areas is known as statistical relational learning or probabilistic logic learning. ProbLog, one of the probabilistic frameworks developed, is an extens...
Parameter and Structure Learning Algorithms for Statistical Relational Learning
My research activity focuses on the field of Machine Learning. Two key challenges in most machine learning applications are uncertainty and complexity. The standard framework for handling uncertainty is probability; for complexity, it is first-order logic. Thus we would like to be able to learn and perform inference in representation languages that combine the two. This is the focus of the field of...
Learning Relational Probabilistic Models from Partially Observed Data - Opening the Closed-World Assumption
Recent years have seen a surge of interest in learning the structure of Statistical Relational Learning (SRL) models that combine logic with probabilities. Most of these models apply the closed-world assumption, i.e., whatever is not observed is false in the world. In this work, we consider the problem of learning the structure of SRL models in the presence of hidden data, i.e., we open the closed...
Inductive Logic Programming meets Relational Databases: An Application to Statistical Relational Learning
With the increasing amount of relational data, scalable approaches to faithfully model this data have become increasingly important. Statistical Relational Learning (SRL) approaches have been developed to learn in the presence of noisy relational data by combining probability theory with first-order logic. However, most learning approaches for these models do not scale well to large datasets. While ...